Histogram

In machine vision terminology, a histogram is an array describing the distribution of greyscale values within a region of interest (ROI): how many pixels in the ROI fall at each greyscale value. The In-Sight Histogram tool's functions use these pixel counts for statistical analysis and feature classification.
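
The following sketch (plain Python with NumPy, not In-Sight spreadsheet syntax; the function name, ROI format, and 8-bit greyscale assumption are illustrative) shows the kind of array a histogram extraction produces from a rectangular ROI: one count per greyscale value.

```python
import numpy as np

def extract_histogram(image, roi, bins=256):
    """Build a greyscale histogram for a rectangular ROI.

    image : 2-D array of 8-bit greyscale pixel values.
    roi   : (row, col, height, width) of the region of interest.
    Returns an array of length `bins` where entry i is the number of
    pixels in the ROI whose greyscale value equals i.
    """
    row, col, height, width = roi
    region = image[row:row + height, col:col + width]
    hist, _ = np.histogram(region, bins=bins, range=(0, bins))
    return hist
```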

In-Sight vision systems perform histogram analysis using the Histogram tool's functions (a sketch of several of the underlying computations follows the list):

  • ExtractColorHistogram: Extracts a color histogram array from a ROI.
  • ExtractHistogram: Extracts a greyscale histogram array from a ROI.
  • HistContrast: Computes the contrast within the ROI.
  • HistCount: Counts the number of pixels within a range of greyscale values.
  • HistHead: Finds the lowest greyscale value within a range.
  • HistHeadPercentage: Returns the greyscale value that is greater than the specified Percentage of pixels in the specified range of the histogram.
  • HistMax: Finds the most prevalent greyscale value within a range.
  • HistMean: Computes the greyscale mean value.
  • HistMin: Finds the least prevalent greyscale value within a range.
  • HistSDev: Computes the greyscale standard deviation.
  • HistSum: Computes the sum of pixel greyscales.
  • HistSumSquare: Computes the sum of the squared pixel greyscale values.
  • HistTail: Finds the highest greyscale value within a range.
  • HistTailPercentage: Returns the greyscale value that is less than the specified Percentage of pixels in the specified range of the histogram.
  • HistThresh: Computes an optimum binary threshold.
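
These functions operate on the extracted histogram array rather than on the image itself. The sketch below (again plain Python/NumPy; the function names are illustrative, and Otsu's between-class-variance criterion stands in for whatever threshold criterion HistThresh actually uses) shows how a mean, a standard deviation, and a binary threshold can be derived from a 256-bin greyscale histogram.

```python
import numpy as np

def hist_mean(hist):
    """Mean greyscale value of the pixels summarized by the histogram."""
    values = np.arange(len(hist))
    return np.sum(values * hist) / np.sum(hist)

def hist_sdev(hist):
    """Standard deviation of the greyscale values."""
    values = np.arange(len(hist))
    mean = hist_mean(hist)
    return np.sqrt(np.sum(hist * (values - mean) ** 2) / np.sum(hist))

def hist_thresh(hist):
    """Binary threshold chosen with Otsu's method: the greyscale value
    that maximizes the between-class variance of the two classes."""
    total = np.sum(hist)
    values = np.arange(len(hist))
    best_t, best_var = 0, -1.0
    for t in range(1, len(hist)):
        w0 = np.sum(hist[:t])   # pixels darker than the candidate threshold
        w1 = total - w0         # pixels at or above the candidate threshold
        if w0 == 0 or w1 == 0:
            continue
        mu0 = np.sum(values[:t] * hist[:t]) / w0
        mu1 = np.sum(values[t:] * hist[t:]) / w1
        between = w0 * w1 * (mu0 - mu1) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t
```

Used together with the extraction sketch above, `hist_thresh(extract_histogram(image, roi))` yields a threshold suitable for binarizing the ROI.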